STATISTICAL INFERENCE OF EXPONENTIAL RECORD DATA UNDER KULLBACK-LEIBLER DIVERGENCE MEASURE

Authors

Abstract


Similar resources

Kullback–Leibler divergence and the Pareto–Exponential approximation

Recent radar research interests in the Pareto distribution as a model for X-band maritime surveillance radar clutter returns have resulted in analysis of the asymptotic behaviour of this clutter model. In particular, it is of interest to understand when the Pareto distribution is well approximated by an Exponential distribution. The justification for this is that under the latter clutter model ...
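As a rough illustration of the comparison described above (a sketch only, not code from the cited paper), the KL divergence from a Pareto Type II (Lomax) model to an Exponential model can be evaluated numerically; the parameters below are purely hypothetical.

```python
# Illustrative sketch only (not from the cited paper): numerically evaluate
# KL(Pareto || Exponential) to gauge how well an Exponential distribution
# approximates a Lomax (Pareto Type II) clutter model with a matched mean.
import numpy as np
from scipy import stats
from scipy.integrate import quad

def kl_divergence(p_pdf, q_pdf):
    """KL(P || Q) = integral over x >= 0 of p(x) * log(p(x) / q(x)) dx."""
    integrand = lambda x: p_pdf(x) * (np.log(p_pdf(x)) - np.log(q_pdf(x)))
    value, _ = quad(integrand, 0.0, np.inf)
    return value

shape, scale = 20.0, 20.0                     # hypothetical clutter parameters
pareto = stats.lomax(c=shape, scale=scale)
expon = stats.expon(scale=pareto.mean())      # Exponential with the same mean

print(kl_divergence(pareto.pdf, expon.pdf))   # small value -> close approximation
```

A large Lomax shape parameter makes the tail nearly exponential, so the divergence shrinks; this is the kind of asymptotic regime the abstract refers to.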


Texture Similarity Measure Using Kullback-Leibler Divergence between Gamma Distributions

We propose a texture similarity measure based on the Kullback-Leibler divergence between gamma distributions (KLGamma). We conjecture that the spatially smoothed Gabor filter magnitude responses of some classes of visually homogeneous stochastic textures are gamma distributed. Classification experiments with disjoint test and training images show that the KLGamma measure performs better than o...
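For reference (a sketch under the usual shape/rate parameterisation, not the authors' implementation), the KL divergence between two gamma densities has a closed form; symmetrising it is one common way to turn it into a similarity score.

```python
# Closed-form KL divergence between Gamma(shape=a1, rate=b1) and
# Gamma(shape=a2, rate=b2); the symmetrised sum can serve as a similarity score.
import numpy as np
from scipy.special import gammaln, digamma

def kl_gamma(a1, b1, a2, b2):
    """KL( Gamma(a1, rate b1) || Gamma(a2, rate b2) )."""
    return ((a1 - a2) * digamma(a1)
            - gammaln(a1) + gammaln(a2)
            + a2 * (np.log(b1) - np.log(b2))
            + a1 * (b2 - b1) / b1)

def symmetric_kl_gamma(a1, b1, a2, b2):
    return kl_gamma(a1, b1, a2, b2) + kl_gamma(a2, b2, a1, b1)

print(symmetric_kl_gamma(2.0, 1.0, 3.0, 0.5))   # illustrative parameters
```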


Kullback-Leibler Divergence Measure for Multivariate Skew-Normal Distributions

The aim of this work is to provide the tools to compute the well-known Kullback–Leibler divergence measure for the flexible family of multivariate skew-normal distributions. In particular, we use the Jeffreys divergence measure to compare the multivariate normal distribution with the skew-multivariate normal distribution, showing that this is equivalent to comparing univariate versions of these...
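For context (illustration only, not the multivariate skew-normal tools developed in that paper), the Jeffreys divergence mentioned above is simply the symmetrised KL divergence, J(P, Q) = KL(P || Q) + KL(Q || P); the univariate normal case below just shows the construction.

```python
# Jeffreys divergence = KL(P || Q) + KL(Q || P), shown for univariate normals
# (closed-form KL); the cited paper treats the multivariate skew-normal family.
import numpy as np

def kl_normal(mu1, s1, mu2, s2):
    """KL( N(mu1, s1^2) || N(mu2, s2^2) )."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2.0 * s2**2) - 0.5

def jeffreys(mu1, s1, mu2, s2):
    return kl_normal(mu1, s1, mu2, s2) + kl_normal(mu2, s2, mu1, s1)

print(jeffreys(0.0, 1.0, 1.0, 2.0))
```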


Rényi Divergence and Kullback-Leibler Divergence

Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibler divergence...
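A quick numerical sanity check of that order-1 statement (illustrative exponential distributions only, not taken from the paper):

```python
# Rényi divergence D_alpha(P || Q) = log( integral of p^alpha * q^(1 - alpha) dx ) / (alpha - 1);
# as the order alpha -> 1 it approaches the Kullback-Leibler divergence KL(P || Q).
import numpy as np
from scipy import stats
from scipy.integrate import quad

p = stats.expon(scale=1.0)   # illustrative choice of P
q = stats.expon(scale=2.0)   # illustrative choice of Q

def renyi(alpha):
    value, _ = quad(lambda x: p.pdf(x)**alpha * q.pdf(x)**(1.0 - alpha), 0.0, np.inf)
    return np.log(value) / (alpha - 1.0)

def kl():
    value, _ = quad(lambda x: p.pdf(x) * np.log(p.pdf(x) / q.pdf(x)), 0.0, np.inf)
    return value

print(renyi(0.999), kl())    # nearly identical values
```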



Journal

Journal title: Statistics in Transition New Series

Year: 2019

ISSN: 1234-7655, 2450-0291

DOI: 10.21307/stattrans-2019-011